Accelerated inexact composite gradient methods for nonconvex spectral optimization problems

Authors

Abstract

This paper presents two inexact composite gradient methods, one inner accelerated and another doubly accelerated, for solving a class of nonconvex spectral optimization problems. More specifically, the objective function of these problems is of the form $$f_{1}+f_{2}+h$$, where $$f_{1}$$ and $$f_{2}$$ are differentiable matrix functions with Lipschitz continuous gradients, $$h$$ is a proper closed convex function, and both $$f_{2}$$ and $$h$$ can be expressed as functions that operate on the singular values of their inputs. The methods essentially use an accelerated composite gradient method to solve a sequence of proximal subproblems involving the linear approximation of $$f_{1}$$ and the singular value functions underlying $$f_{2}$$ and $$h$$. Unlike other gradient-based methods, the proposed ones take advantage of the spectral structure of the objective function in order to efficiently generate their solutions. Numerical experiments are presented to demonstrate their practicality on a set of real-world and randomly generated spectral optimization problems.
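The computational kernel the abstract describes is a proximal subproblem: linearize the smooth part of the objective, then apply the proximal map of the spectral term, which reduces to a scalar proximal map acting on singular values. A minimal sketch of one such step, using the nuclear norm as an assumed example of a spectral convex term $$h$$ (the function names and the quadratic test objective below are illustrative, not taken from the paper):

```python
import numpy as np

def prox_nuclear(Z, lam):
    """Proximal map of lam * nuclear norm via singular-value soft-thresholding.
    A spectral function inherits its prox from the scalar prox applied to
    the singular values of its input."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

def composite_gradient_step(X, grad_f, step, lam):
    """One proximal-gradient step on f + lam*||.||_*: take a gradient step
    on the linearized smooth part f, then apply the spectral prox."""
    return prox_nuclear(X - step * grad_f(X), step * lam)

# Illustrative smooth part: f(X) = 0.5*||X - A||_F^2, so grad_f(X) = X - A.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))
X = np.zeros((5, 4))
for _ in range(50):
    X = composite_gradient_step(X, lambda Z: Z - A, step=1.0, lam=0.5)
```

For this simple quadratic, the iteration reaches its fixed point (the prox of A itself) in one step; the loop is kept only to show the iterative shape of the method.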


Similar articles

Accelerated Proximal Gradient Methods for Nonconvex Programming

Nonconvex and nonsmooth problems have recently received considerable attention in signal/image processing, statistics and machine learning. However, solving the nonconvex and nonsmooth optimization problems remains a big challenge. Accelerated proximal gradient (APG) is an excellent method for convex programming. However, it is still unknown whether the usual APG can ensure the convergence to a...

Efficient Inexact Proximal Gradient Algorithm for Nonconvex Problems

The proximal gradient algorithm has been popularly used for convex optimization. Recently, it has also been extended for nonconvex problems, and the current state-of-the-art is the nonmonotone accelerated proximal gradient algorithm. However, it typically requires two exact proximal steps in each iteration, and can be inefficient when the proximal step is expensive. In this paper, we propose an...

Accelerated gradient methods for nonconvex nonlinear and stochastic programming

In this paper, we generalize the well-known Nesterov’s accelerated gradient (AG) method, originally designed for convex smooth optimization, to solve nonconvex and possibly stochastic optimization problems. We demonstrate that by properly specifying the stepsize policy, the AG method exhibits the best known rate of convergence for solving general nonconvex smooth optimization problems by using ...
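As a point of reference, the classical Nesterov accelerated gradient iteration that this work generalizes can be sketched as follows; the stepsize 1/L, the momentum recursion, and the quadratic test problem are standard textbook choices, not details from the paper:

```python
import numpy as np

def accelerated_gradient(grad, x0, L, iters=200):
    """Nesterov-style accelerated gradient sketch (convex smooth form):
    gradient step at the extrapolated point y, then momentum update."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_new = y - grad(y) / L          # gradient step with stepsize 1/L
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # extrapolation
        x, t = x_new, t_new
    return x

# Illustrative problem: minimize 0.5*||x - b||^2, whose gradient is x - b (L = 1).
b = np.array([1.0, -2.0, 3.0])
x_star = accelerated_gradient(lambda x: x - b, np.zeros(3), L=1.0)
```

The nonconvex extensions surveyed above modify the stepsize policy so that this scheme retains convergence guarantees without convexity.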

Inexact proximal stochastic gradient method for convex composite optimization

We study an inexact proximal stochastic gradient (IPSG) method for convex composite optimization, whose objective function is a summation of an average of a large number of smooth convex functions and a convex, but possibly nonsmooth, function. Variance reduction techniques are incorporated in the method to reduce the stochastic gradient variance. The main feature of this IPSG algorithm is to a...


Journal

Journal title: Computational Optimization and Applications

Year: 2022

ISSN: 0926-6003, 1573-2894

DOI: https://doi.org/10.1007/s10589-022-00377-9